Web Survey Bibliography
The utility of an archived survey depends largely on the quality and thoroughness of its documentation. This holds for traditional static and computer-assisted data collection, and especially for online surveys. Online surveys allow documentation standards that exceed what is needed for classical paper-based surveys. Existing documentation standards for offline surveys were therefore evaluated with online surveys in mind. In a collaboration of the two GESIS institutes ZUMA and ZA, these standards were revised and, where necessary, extended for use in online surveys. New demands arise especially because (a) partial completions (dropouts) remain in the dataset, and (b) variations in the visual design affect the answers. We may also add (c) particularities of online surveys that are based on, or include, experimental designs. Negative effects in online surveys (e.g. high nonresponse) are often caused by inadequate and poor operationalisations of the questionnaire. To allow an assessment of any research result, information on the operationalisations must be provided to the reader. Similarly, a description of the flow of the questionnaire (filter paths) is needed. Likewise, the layout and design of single questions should be documented in online surveys, as they are known to influence answer behaviour.
The resulting guidelines for documentation practice in online surveys therefore focus on the following aspects:
1) Coding of missing data. Our proposed scheme for missing-data documentation combines two known standards, clears them of unnecessary irregularities, and extends them to online surveys.
2) Codebook. We propose an extended version of the classical codebook with simple frequency distributions, information on the page number, and the way each question was displayed.
3) Process documentation. We explain how the questionnaire flow and controls may be described.
4) Variable labels. The well-known prefix-stem-suffix labelling approach is explicated for online surveys.
Examples are used to illustrate and discuss the guidelines.
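The abstract does not reproduce the concrete codes or naming rules, so the following is only a minimal sketch of what a prefix-stem-suffix variable label and an online-extended missing-data scheme might look like; the specific prefixes, separators, and numeric codes are illustrative assumptions, not the guidelines themselves.

```python
def make_label(prefix: str, stem: str, suffix: str = "") -> str:
    """Compose a variable name from the prefix-stem-suffix scheme.

    prefix: variable class (hypothetical: 'v' = substantive item,
            'p' = paradata such as response time)
    stem:   question identifier (e.g. 'q07')
    suffix: sub-item or split-ballot form marker (e.g. 'a', 'b')
    """
    parts = [prefix, stem] + ([suffix] if suffix else [])
    return "_".join(parts)

# Hypothetical missing-data codes: the online extension is the ability to
# distinguish an item skipped on a seen page from items never reached
# because the respondent dropped out.
MISSING_CODES = {
    -77: "item skipped by respondent",
    -88: "not reached (dropout before this page)",
    -99: "filtered out (question not applicable)",
}

print(make_label("v", "q07", "a"))  # → v_q07_a
print(make_label("p", "time"))      # → p_time
```

A codebook entry would then pair each such label with its frequency distribution, page number, and display format, as outlined in points 1) and 2) above.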
The utility of an archived survey depends largely on the quality and level of detail of its documentation. This holds for traditional static and computer-assisted data collection, but especially for surveys conducted via the Internet. Compared with paper-based surveys, online surveys permit documentation standards that go far beyond current practice.
In a collaboration of the two GESIS institutes ZUMA and ZA, the established documentation standards for offline surveys were reviewed for their suitability in online surveys and, where necessary, extended. New requirements arise mainly because (a) incomplete interviews (dropouts) enter the dataset, and (b) the numerous possible variations in visual design can produce specific response effects. Added to this are (c) particularities that apply when online surveys are based on experimental designs.
In online surveys, negative effects such as high nonresponse can often be traced back to deficient operationalisations. To allow an assessment of the research results, the operationalisation should therefore be available to the reader. Along with this, a description of the questionnaire flow and questionnaire control is necessary for a more precise documentation of dropout rates. Likewise, the design and layout of individual question types should be documented, particularly in the online context, as they can influence response behaviour.
The resulting documentation guidelines therefore focus on the following aspects:
1) Coding of missing data: The priorities are the removal of irregularities in code assignment, general validity across the most varied kinds of surveys, and the extension to online surveys.
2) Codebook: Extension of the classical codebook with frequency tables, information on the survey page, and the display format used.
3) Process documentation: Description of the questionnaire flow and questionnaire control.
4) Variable names: Explication of the well-known prefix-stem-suffix scheme for online surveys.
The standards developed here are presented and put up for discussion using examples.
Web survey bibliography - Germany (361)
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Comparing the same Questionnaire between five Online Panels: A Study of the Effect of Recruitment Strategy...; 2017; Schnell, R.; Panreck, L.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- Social Desirability and Undesirability Effects on Survey Response latencies; 2017; Andersen, H.; Mayerl, J.
- Comparison of response patterns in different survey designs: a longitudinal panel with mixed-mode and...; 2017; Ruebsamen, N.; Akmatov, M. K.; Castell, S.; Karch, A.; Mikolajczyk, R. T.
- Mobile Research im Kontext der digitalen Transformation; 2017; Friedrich-Freksa, M.
- Kognitives Pretesting; 2017; Neuert, C.
- Grundzüge des Datenschutzrechts und aktuelle Datenschutzprobleme in der Markt- und Sozialforschung; 2017; Schweizer, A.
- Article Establishing an Open Probability-Based Mixed-Mode Panel of the General Population in Germany...; 2017; Bosnjak, M.; Dannwolf, T.; Enderle, T.; Schaurer, I.; Struminskaya, B.; Tanner, A.; Weyandt, K.
- Socially Desirable Responding in Web-Based Questionnaires: A Meta-Analytic Review of the Candor Hypothesis...; 2016; Gnambs, T.; Kaspar, K.
- Methodological Aspects of Central Left-Right Scale Placement in a Cross-national Perspective; 2016; Scholz, E.; Zuell, C.
- Predicting and Preventing Break-Offs in Web Surveys; 2016; Mittereder, F.
- Incorporating eye tracking into cognitive interviewing to pretest survey questions; 2016; Neuert, C.; Lenzner, T.
- Geht’s auch mit der Maus? – Eine Methodenstudie zu Online-Befragungen in der Jugendforschung...; 2016; Heim, R.; Konowalczyk, S.; Grgic, M.; Seyda, M.; Burrmann, U.; Rauschenbach, T.
- Comparing Cognitive Interviewing and Online Probing: Do They Find Similar Results?; 2016; Meitinger, K.; Behr, D.
- Device Effects - How different screen sizes affect answers in online surveys; 2016; Fisher, B.; Bernet, F.
- Effects of motivating question types with graphical support in multi channel design studies; 2016; Luetters, H.; Friedrich-Freksa, M.; Vitt, S.; Goldstein, D. G.
- Analyzing Cognitive Burden of Survey Questions with Paradata: A Web Survey Experiment; 2016; Hoehne, J. K.; Schlosser, S.; Krebs, D.
- Secondary Respondent Consent in the German Family Panel; 2016; Schmiedeberg, C.; Castiglioni, L.; Schroeder, J.
- Does Changing Monetary Incentive Schemes in Panel Studies Affect Cooperation? A Quasi-experiment on...; 2016; Schaurer, I.; Bosnjak, M.
- Using Cash Incentives to Help Recruitment in a Probability Based Web Panel: The Effects on Sign Up Rates...; 2016; Krieger, U.
- The Mobile Web Only Population: Socio-demographic Characteristics and Potential Bias; 2016; Fuchs, M.; Metzler, A.
- The Impact of Scale Direction, Alignment and Length on Responses to Rating Scale Questions in a Web...; 2016; Keusch, F.; Liu, M.; Yan, T.
- Web Surveys Versus Other Survey Modes: An Updated Meta-analysis Comparing Response Rates; 2016; Wengrzik, J.; Bosnjak, M.; Lozar Manfreda, K.
- Retrospective Measurement of Students’ Extracurricular Activities with a Self-administered Calendar...; 2016; Furthmueller, P.
- Privacy Concerns in Responses to Sensitive Questions. A Survey Experiment on the Influence of Numeric...; 2016; Bader, F.; Bauer, J.; Kroher, M.; Riordan, P.
- Ballpoint Pens as Incentives with Mail Questionnaires – Results of a Survey Experiment; 2016; Heise, M.
- Does survey mode matter for studying electoral behaviour? Evidence from the 2009 German Longitudinal...; 2016; Bytzek, E.; Bieber, I. E.
- Forecasting proportional representation elections from non-representative expectation surveys; 2016; Graefe, A.
- Setting Up an Online Panel Representative of the General Population The German Internet Panel; 2016; Blom, A. G.; Gathmann, C.; Krieger, U.
- Online Surveys are Mixed-Device Surveys. Issues Associated with the Use of Different (Mobile) Devices...; 2016; Toepoel, V.; Lugtig, P. J.
- Stable Relationships, Stable Participation? The Effects of Partnership Dissolution and Changes in Relationship...; 2016; Mueller, B.; Castiglioni, L.
- Will They Stay or Will They Go? Personality Predictors of Dropout in Online Study; 2016; Nestler, S.; Thielsch, M.; Vasilev, E.; Back, M.
- Respondent Conditioning in Online Panel Surveys: Results of Two Field Experiments; 2016; Struminskaya, B.
- A Privacy-Friendly Method to Reward Participants of Online-Surveys; 2015; Herfert, M.; Lange, B.; Selzer, A.; Waldmann, U.
- The impact of frequency rating scale formats on the measurement of latent variables in web surveys -...; 2015; Menold, N.; Kemper, C. J.
- Investigating response order effects in web surveys using eye tracking; 2015; Hoehne, J. K.; Lenzner, T.
- Implementation of the forced answering option within online surveys: Do higher item response rates come...; 2015; Decieux, J. P.; Mergener, A.; Neufang, K.; Sischka, P.
- Translating Answers to Open-ended Survey Questions in Cross-cultural Research: A Case Study on the Interplay...; 2015; Behr, D.
- The Effects of Questionnaire Completion Using Mobile Devices on Data Quality. Evidence from a Probability...; 2015; Bosnjak, M.; Struminskaya, B.; Weyandt, K.
- Are they willing to use the web? First results of a possible switch from PAPI to CAPI/CAWI in an establishment...; 2015; Ellguth, P.; Kohaut, S.
- Measuring Political Knowledge in Web-Based Surveys: An Experimental Validation of Visual Versus Verbal...; 2015; Munzert, S.; Selb, P.
- Changing from CAPI to CAWI in an ongoing household panel - experiences from the German Socio-Economic...; 2015; Schupp, J.; Sassenroth, D.
- Rating Scales in Web Surveys: A Test of New Drag-and-Drop Rating Procedures; 2015; Kunz, T.
- Mode System Effects in an Online Panel Study: Comparing a Probability-based Online Panel with two Face...; 2015; Struminskaya, B.; De Leeuw, E. D.; Kaczmirek, L.
- Higher response rates at the expense of validity? Consequences of the implementation of the ‘forced...; 2015; Decieux, J. P.; Mergener, A.; Neufang, K.; Sischka, P.
- A quasi-experiment on effects of prepaid versus promised incentives on participation in a probability...; 2015; Schaurer, I.; Bosnjak, M.
- Response Effects of Prenotification, Prepaid Cash, Prepaid Vouchers, and Postpaid Vouchers: An Experimental...; 2015; van Veen, F.; Goeritz, A.; Sattler, S.
- Recruiting Respondents for a Mobile Phone Panel: The Impact of Recruitment Question Wording on Cooperation...; 2015; Busse, B.; Fuchs, M.
- The Influence of the Answer Box Size on Item Nonresponse to Open-Ended Questions in a Web Survey ; 2015; Zuell, C.; Menold, N.; Koerber, S.